On efficiently combining limited-memory and trust-region techniques

Authors

  • Oleg Burdakov
  • Lujin Gong
  • Spartak Zikrin
  • Ya-Xiang Yuan
Abstract

Limited memory quasi-Newton methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of them deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited memory methods are usually combined with a line search. We show how to efficiently combine limited memory and trust-region techniques. One of our approaches is based on the eigenvalue decomposition of the limited memory quasi-Newton approximation of the Hessian matrix. The decomposition allows for finding a nearly exact solution to the trust-region subproblem defined by the Euclidean norm with an insignificant computational overhead compared with the cost of computing the quasi-Newton direction in line-search limited memory methods. The other approach is based on two new eigenvalue-based norms. The advantage of the new norms is that the trust-region subproblem is separable and each of the smaller subproblems is easy to solve. We show that our eigenvalue-based limited-memory trust-region methods are globally convergent. Moreover, we propose improved versions of the existing limited-memory trust-region algorithms. The presented results of numerical experiments demonstrate the efficiency of our approach, which is competitive with line-search versions of the L-BFGS method.
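The abstract describes the first, eigenvalue-decomposition-based approach only at a high level. The following is a minimal sketch of that idea, not the authors' implementation: it assumes the limited-memory Hessian approximation is available in a compact form B = delta*I + Psi*M*Psi^T (Psi is n-by-k with k small, M is k-by-k symmetric), uses illustrative names, solves the secular equation by simple bisection, and ignores the so-called hard case.

```python
# Sketch: nearly exact Euclidean-norm trust-region step from the eigenvalue
# decomposition of a compact limited-memory approximation
#     B = delta*I + Psi @ M @ Psi.T .
# Assumptions (not from the paper): input names, bisection on the secular
# equation, no hard-case handling.
import numpy as np

def eigen_tr_subproblem(g, delta, Psi, M, Delta, tol=1e-10):
    n, k = Psi.shape
    # 1. Thin QR of Psi and eigendecomposition of the small k-by-k matrix.
    Q, R = np.linalg.qr(Psi)                  # Psi = Q R, Q: n-by-k
    lam_hat, V = np.linalg.eigh(R @ M @ R.T)  # R M R^T = V diag(lam_hat) V^T
    P_par = Q @ V                             # eigenvectors of B within span(Psi)
    lam = lam_hat + delta                     # eigenvalues of B on span(Psi);
                                              # the remaining n-k eigenvalues equal delta.
    # 2. Split the gradient into components parallel/orthogonal to span(Psi).
    g_par = P_par.T @ g
    g_perp_norm = np.sqrt(max(g @ g - g_par @ g_par, 0.0))

    # Norm of the trial step p(sigma) = -(B + sigma*I)^{-1} g.
    def step_norm(sigma):
        return np.sqrt(np.sum((g_par / (lam + sigma)) ** 2)
                       + (g_perp_norm / (delta + sigma)) ** 2)

    lam_min = min(lam.min(), delta)
    if lam_min > 0 and step_norm(0.0) <= Delta:
        sigma = 0.0                           # interior quasi-Newton step
    else:
        # 3. Bisection on the secular equation ||p(sigma)|| = Delta.
        lo = max(0.0, -lam_min) + 1e-12
        hi = lo + 1.0
        while step_norm(hi) > Delta:
            hi *= 2.0
        while hi - lo > tol * max(1.0, hi):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if step_norm(mid) > Delta else (lo, mid)
        sigma = 0.5 * (lo + hi)

    # 4. Assemble the step from its spectral components.
    p = P_par @ (-g_par / (lam + sigma)) - (g - P_par @ g_par) / (delta + sigma)
    return p, sigma
```

With k much smaller than n, the small eigendecomposition costs O(k^3) and the dominant O(n k^2) work is the thin QR; an efficient implementation would update the relevant factorizations between iterations rather than recompute them, which is in line with the abstract's claim of insignificant overhead relative to computing a quasi-Newton direction.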


Similar articles

A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited memory BFGS updating formula together with an appropriate adaptive radius strategy. In this approach, the adaptive technique reduces the number of subproblems to be solved, while utilizing the structure of limited memory quasi-Newt...


A first-order symmetric updating method for solving large-scale optimization problems

Finding a local minimizer of an unconstrained optimization problem and finding a fixed point of the associated gradient system of ordinary differential equations are two closely related problems. Limited-memory algorithms are widely used to solve large-scale problems, while Runge–Kutta methods are also used to solve differential equations numerically. In this paper, using the concept of the subspace method and...


A Limited Memory BFGS Algorithm with Super Relaxation Technique for Nonlinear Equations

In this paper, a trust-region algorithm combined with the limited memory BFGS (L-BFGS) update is proposed for solving nonlinear equations, where the super relaxation technique (SRT) is used. The next iteration point is chosen by the SRT. Global convergence without the nondegeneracy assumption is obtained under suitable conditions. Numerical results show that this method is very effective for la...


Limited Memory BFGS Updating in a Trust-Region Framework

The limited memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line search method where the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that the search directions are obtained by a matrix–vector multiplication. In this paper it is observed that limited memory updates to the Hessian approximatio...


On Efficiently Computing the Eigenvalues of Limited-Memory Quasi-Newton Matrices

In this paper, we consider the problem of efficiently computing the eigenvalues of limited-memory quasi-Newton matrices that exhibit a compact formulation. In addition, we produce a compact formula for quasi-Newton matrices generated by any member of the Broyden convex class of updates. Our proposed method makes use of efficient updates to the QR factorization that substantially reduces the cos...



Journal:
  • Math. Program. Comput.

Volume 9, Issue -

Pages -

Publication date: 2017